Dick Mills on computer programming: "If it ain't tested, it doesn't work"


I asked Dick Mills how he can maintain his position that Y2K will not cause tremendous problems in light of his strongly held belief that "If it ain't tested, it doesn't work." It just seems inconceivable to me that this respected computer expert can hold such a strong opinion for a single programming job, but when confronted with the certainty that millions and millions of INTERCONNECTED programs WILL NOT BE TESTED IN WAYS THAT DUPLICATE THEIR DAILY USE, he shrugs his shoulders and begins one petty argument after another. Here's what I mean...

My old programmer's rule is, "If it ain't tested it doesn't work."

Dick:

Please explain to me how you can repeat this phrase over and over again and yet ignore the fact that there is virtually NO END-TO-END Y2K TESTING GOING ON, and please don't refer me to one of your "Lessons Learned" articles -- as if it all can be easily explained away. Anyone who has been following Y2K developments knows that we are incredibly vulnerable, and yet you continue to spout your middle-of-the-road ("inconveniences, but no calamities") predictions for next year. It simply doesn't hold water with those of us who know the score. Basically you have marginalized yourself to no more than a government lackey spouting the party line so the sheeple won't panic.

For one who gives the appearance of scrupulously guarding his credibility, how can you justify this position? If people had "panicked" earlier, SOME would have had a chance to prepare adequately.

By the time they realize they've been had, the panic will go beyond anything imaginable, and you'll have to live with THAT for the rest of your life.

Dr. Roger Altman

I was about to respond to your first sentence and try to explain, but the rest of the letter indicates that you won't like anything I write. So please explain for me:

>It simply doesn't hold water with those of us who know the score.

What is your source of illumination? Are you one of Gary North's dingalings?

What are your qualifications for judging?

What *is* the score?

The only truly end-to-end test for Y2K would involve every chip in the whole world. How would you propose doing that? If the test failed how would it be less damaging than the real Y2K?

-- Dick Mills www.albany.net/~dmills dmills@albany.net

Dick:

Come on, you are not entitled to pull that one. Remember the OLLLLLLLD PROGRAMMER'S RULE: If it ain't tested, it doesn't work.

So it's up to the Pollyannas (tough-minded Dick Mills a Pollyanna? PERISH THE THOUGHT!!!) to demonstrate that enough END-TO-END testing HAS BEEN COMPLETED TO EVEN BEGIN TO CLAIM THAT THERE WILL BE ONLY INCONVENIENCES AND NOT CALAMITIES du jour.

Sorry Dick, based on your own reasoning (you know, that ollllllllld programmer stuff again), it's up to YOU to show ME the evidence of successful end-to-end testing. Otherwise, "It doesn't work," remember? Hell, I'm so certain that we're witnessing the biggest PR campaign in the history of the world that I'd be surprised if you could show me ANY evidence that such tests have been PLANNED, let alone attempted, and (now here's a laugh) successfully completed.

So until I hear differently, I really think you should at least be consistent with your "inconvenience" Y2k position and amend your slogan to,

"If it ain't tested IGNORE IT!"

Give it your best shot, but, frankly, I'm not expecting much from you other than more silly semantic arguments. For a so-called hands-on, practical engineer, I would have thought such intellectual hand-waving would be beneath you. So if that's the kind of reply you plan to write, well, in that regard, I did indeed misjudge you.

Dr. A

I won't take your attempts to bait me until you tell me your qualifications to judge the answer. Do you know anything at all that you haven't read on the Internet?

-- Dick Mills www.albany.net/~dmills dmills@albany.net

Dick:

Oh, I get it. I have to have some threshold level of credentials in order to receive information that is just too complex for the average doofus to understand. So, if I qualify by having the 'correct' formal education, preferably through the doctoral level, combined with many years of 'the right kind' of on-the-job experience and specialized training, then, and only then, would you provide me with your supporting evidence.

You know, Dick, I think you are one of the few people left on this planet who has demonstrated the capability to make me even more cynical than I was BEFORE we started this exchange.

If these emails represent the best that Dick Mills has to offer, then we really are headed for the 'ol crapper'. Could you, at least, throw me a few crumbs of enlightenment before I decide to end it all right here and now?

Plaeeeeeze, my life is in your hands,

Dr. A.

Can someone explain what the hell is going on here?



-- Roger Altman (RogAltman@AOL.com), October 06, 1999

Answers

This is indeed a rather disturbing response from Dick Mills, whom I certainly have always respected, though often disagreed with. As an engineer, he certainly should not stoop to such an evasive (not to mention emotional) response. He should also be even more concerned about testing in view of the recent failure of the Mars probe due to the incompatibility of English versus metric measurements -- a living example of what happens when you have a complex system that depends on interfacing various components, especially if they have not been tested as a single, cohesive system.

86 days.

Y2K CANNOT BE FIXED!

-- Jack (jsprat@eld.~net), October 06, 1999.

Dr. Altman,

I'm not real qualified to debate with you on the issue, other than having worked on a Y2K desktop remediation project for the last ten months in my job as that lowly form of life known as a PC technician. I really don't know much about the embedded systems or end-to-end testing that you're referring to. But I did want to add a couple of comments:

1. As Mr. Mills (whom I've met in person here in Albany, NY, by the way) stated, the level of comprehensive, end-to-end testing that you propose doesn't seem feasible. Your views seem to indicate the "100% ready or we're toast," all-or-nothing mentality or viewpoint about worldwide remediation efforts. I don't know that the comfort level you seek about testing will ever be felt.

2. What are your motives here? Dick Mills seemed like an even-tempered guy when I met him at a Y2K interest group meeting; a couple of emotionally-charged attendees of that meeting were unsuccessful in their attempts to draw Dick into a sensationalized, apocalyptic discussion of the issues surrounding Y2K. Are your posts directed at Mr. Mills intended to incite barb-laden verbal sparring, or are they intended to provoke civil, friendly discussion? (Once again, I met Mr. Mills only once, and for only a short period of time, so I certainly don't portray myself as his colleague or friend. I do respect his knowledge in the IT field.) If we are all in as much trouble as you say, what will on-line episodes like your debates with Dick accomplish? You are probably an even-tempered, mild-mannered individual if I were to talk to you in person, but your posts don't portray that at all. I intend you no disrespect whatsoever, as with anyone else.

I welcome any response to either the forum or my e-mail account. Thanks very much.

-- Larry Goldberg (goldberg_lawrence@hotmail.com), October 06, 1999.


Roger,

He could be a covert elitist (like De Jager). It's obvious that he has no intention of answering your question, because it would expose the inconsistencies between his general heuristic knowledge (rules of thumb gained from experience) and his present public opinion on the Y2K outcome.

It is telling that he implies that there is NO WAY to test until the rollover, because any system-wide testing would do harm prematurely at the same level as Y2K itself.

Consider that for a minute. If all was remediated perfectly, then there would be 'no problem' in doing system-wide testing earlier. He is saying/implying that Y2K WILL BE the test of all these repair efforts, and that the potential for failure is large enough that the folks in charge cannot allow the problems to surface before they are forced to surface.

That is the rationalization being used throughout the USA and the world for not doing intersystem testing NOW. Ergo we will contend with massive failures and domino-effect failures well into 2000 on all fronts. This follows because testing finds 85% of the bugs and use finds the other 14% (with 1% remaining to keep the programmers employed now and forevermore, amen). No system-wide testing means that ALL failures will be front and center on day one of usage (a rate roughly 600% greater than if testing had been performed and the bugs killed before use).

Businesses as a rule will not disrupt their daily operations by choice. They will not throw the switch until they must. They have made decisions about remediation, testing and contingency planning which do not allow disruptions. In tightly integrated enterprises such as chemical plants/oil refineries or JIT type systems this means that any efforts will be simply nibbling around the edges and the rest will be FOF (Fix On Failure). This has been admitted publicly for a long time, I think.

I just reviewed our hospital's Y2K statement. It LOOKS good, but under the covers it's just as bad as anything you've seen here. No significant contingency planning (just use the normal 'disaster plan'), no 'stockpiling', no senior management involved in the project on any level whatever.

When TSHTF there are going to be a huge number of folks who will say that they 'did the best that they could'. There will be a tremendous effort to cast this as a 'social failure' rather than to hold individuals accountable for their negligence. I just hope that I can stay as far away from these people as I can.

God help us all.

-- ..- (dit@dot.dash), October 06, 1999.
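
A quick check of the arithmetic in the post above: the 85/14/1 split and the "600% greater" figure do hang together. Here is a minimal sketch in Python, using a hypothetical pool of 100 latent bugs purely for illustration -- the percentages are the poster's, not measured data:

# Back-of-the-envelope check of the bug-discovery percentages quoted above.
latent_bugs = 100                         # hypothetical pool, for illustration only
found_by_testing = 0.85 * latent_bugs     # bugs killed before go-live, per the post
found_in_use     = 0.14 * latent_bugs     # bugs that surface during normal use anyway
never_found      = 0.01 * latent_bugs     # "keeps the programmers employed"

with_testing    = found_in_use                       # only the 14% surface on day one
without_testing = found_by_testing + found_in_use    # 99% surface in production instead

print(f"Day-one failures, prior testing done: {with_testing:.0f}")
print(f"Day-one failures, no prior testing:   {without_testing:.0f}")
print(f"Increase: {100 * (without_testing / with_testing - 1):.0f}% greater")
# Prints an increase of roughly 607% -- i.e., the post's "600% greater" figure.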


Gawd, it's just so wonderful that everybody is concerned about being even-tempered and reasonable and stuff. It's male bonding at its finest, to be sure.

Meanwhile, the plain fact is that there is a lot of complex hardware, software and firmware that is supposed to interact, which has a BUILT-IN DESIGN FLAW known as the Y2K problem. Maybe it's all been corrected, piece by piece. (LOL!)

But will it all hang together come January 1?? Is the bear Catholic? Does the Pope shit in the woods?? Is 1 inch = 1 centimeter???

-- King of Spain (madrid@aol.cum), October 06, 1999.

Dr. Altman

In short, I can't speak for embedded systems. I'm strictly an application programmer. Having said that, I agree from my experience that if my programs aren't tested, there's a good chance (99%) they won't work. My applications are too big and complicated for test skipping. Without getting into any kind of a debate here, this fact alone should be enough reason to prepare. An economic impact is possible from defective application code alone, even excluding embedded systems. We know the testing phase is suffering. The fact that fixes are still going on tells me many applications will not be tested. This is plenty of evidence for me and it should be enough for anybody else, especially Dick Mills. I can't understand his thinking either.

-- Larry (cobol.programmer@usa.net), October 06, 1999.
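
To make the point about untested date handling concrete, here is a minimal, hypothetical sketch (in Python, since no actual code appears in this thread) of the classic two-digit-year defect that Y2K remediation projects were chasing, and the kind of rollover test that exposes it. The function names and the pivot value are illustrative assumptions, not anyone's real system:

def expand_year(yy, pivot=50):
    """Window a two-digit year: 00-49 -> 2000s, 50-99 -> 1900s (illustrative pivot)."""
    return 2000 + yy if yy < pivot else 1900 + yy

def years_between(yy1, yy2):
    """Remediated interval calculation using windowed four-digit years."""
    return expand_year(yy2) - expand_year(yy1)

def legacy_years_between(yy1, yy2):
    """The latent defect: raw two-digit subtraction, fine until the century turns."""
    return yy2 - yy1

# The rollover test that "if it ain't tested, it doesn't work" implies.
assert expand_year(99) == 1999
assert expand_year(0) == 2000
assert years_between(99, 0) == 1           # correct: one year forward across 1999 -> 2000
assert legacy_years_between(99, 0) == -99  # the bug only a rollover test would surface
print("Rollover tests pass; the legacy routine fails exactly where daily use never exercised it.")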



Everybody stop worrying.

End-to-end testing will begin in 86 days...

-- Uncle Bob (UNCLB0B@Y2KOK.ORG), October 06, 1999.


Roger:

Do we see another straw man here? An "all or nothing" pile of garbage? Either we test ALL of the systems end to end, or we test NONE of the systems.....

Well, we all know that all of the systems cannot be checked prior to the singularity. Thus, using this logic, it *doesn't matter* whether we check ANY of them end to end............

-- mushroom (mushroom_bs_too_long@yahoo.com), October 06, 1999.


Larry:

Yes, I would like to be even-tempered and maintain a "respectable" dialog with Dick Mills. But many people read his "bump-in-the-road" Y2K opinions and go back to sleep without any thought of preparing for an outcome that, IMHO, will be far, far worse than what Mr. Mills has suggested. So my frustration builds.

As for the suggestion that I'm insisting on nothing less than complete end-to-end testing, well, that's simply preposterous. What I'm asking for is evidence of ANY end-to-end testing, ON WHATEVER SCALE, IN WHATEVER INDUSTRY OR ACROSS INDUSTRIES, WHEREVER AND WHENEVER. In other words, just where are we with regard to ANY KIND of end-to-end testing? Let's get it out in the open. Has there been ANY at all? IMHO, there has been only superficial end-to-end testing, accomplished by SELECTING A FEW OF THE MOST Y2K-COMPLIANT companies within a given industry for "end-to-end" testing, for maximum PR. I hope I'm wrong about this. If I am, then Dick Mills or anyone else, please correct me WITH SPECIFIC FACTS.

However, if my hunch is right, then we really are in deep doo-doo, don't you think? And remember, we owe it to ourselves to compare the actual level of Y2K testing to the way the 'network' actually runs day in and day out. Just how closely are the Y2K tests simulating the 'network'?

Roger

-- Roger Altman (RogAltman@AOL.com), October 06, 1999.


C'mon King of Spain!

The sensationalist nature of much of the Y2K discussion has obscured the reasoned approach in most cases, and that's why I made reference to a more civil tone. Call me a moron for that... What I was trying to express was that there will be no comfortable, warm-fuzzy feeling for those of us in the all-or-nothing, it's-one-huge-big-ole-conspiracy camp. We're gonna all live through this, and hopefully learn lessons from it, but there is no absolute answer. Oh, and also, being civil has nothing to do with male bonding; I've gotten ass-whuppings from female posters as well!! Good day sir!

-- Larry Goldberg (goldberg_lawrence@hotmail.com), October 06, 1999.


It's an excellent policy that if it ain't tested, it don't work. But there are many levels of testing, and there are strict limitations to testing. These can be explained in great detail to anyone interested in them.

But trying to start a dialog about testing by saying "Only the impossible is good enough, and why are you so stupid you can't realize this?" isn't going to get you very far. If complete end to end testing were required, it would have been impossible to get where we are today, since such testing has never been done -- everything was incremental.

-- Flint (flintc@mindspring.com), October 06, 1999.



"...everything was incremental...."

YES! YES! YES!

Go to the light, Flint! You've made it!!

A world of data processing that was built incrementally is going to have to be live-tested instantaneously.

You are there, man. Welcome to the fold.

-- lisa!!! (lisa@work.now), October 06, 1999.


Dr. Altman, I do not know what type of doctor you are, but I know what type I am. I do not blame you for your level of discomfort with Mr. Mills' assertion that he will not explain his opinion until he determines whether you're qualified to judge his opinion. If I told any patient of mine that I would not explain their medical condition or justify my opinion in any way until they proved to me that they were qualified to understand the answer, I would be out of practice in days. Those of us in medicine who behave in this way are known as a "pompous horse's ass." That is the technical medical term, and I will not explain it to Mr. Mills until he proves to me that he has the qualifications to understand it.

-- smfdoc (smfdoc@aol.com), October 06, 1999.

Flint:

Yes indeed "everything WAS incremental", i.e., before Y2k. Now we are being forced to compress 40 years of incremental growth and system evolution into (at best) one order of magnitude less time. There are many of us on this forum who believe that THAT IS IMPOSSIBLE even under the best of circumstances, and even you would admit that we have anything BUT the that.

Roger

-- Roger Altman (RogAltman@AOL.com), October 06, 1999.


Geeee, I guess we overlooked that one small factoid - no live end to end testing. And if it ain't tested, it don't work. Really? Is that true? You mean, if it ain't tested it don't work? So, if it ain't tested, it don't work. Am I missing something? Does that mean that if it ain't tested, it don't work? Am I sounding like a polly yet? Thank God SOME people have enough bloody common sense to see through this crap. All I can say is, DICK MILLS, GET A LIFE...

Owl

-- owl (w@a.com), October 06, 1999.


I always thought that Dick Mills had some good sense and enjoyed his writings. On the other hand, I always thought he had better people skills than this. The nature of his responses here undermines his credibility, IMO. If he were as high and mighty as he makes out, DM would not feel the need to be so impulsively derisive.

Flint makes a good point here. I agree with him that NO amount of testing will satisfy some people's lack of belief. On the other hand, there really hasn't been enough testing (period) in most industries, even of the type-testing kind. If there had been, we would not be seeing the high rates of Y2K failures already reported in corporations.

I will be surprised if we see prolonged Y2K power outages from embedded failures. (Surprised, maybe, but not unprepared... it could be that everybody will be trying to use the phones--tying them up--at the same time domestic terrorists are hacking down power lines at the same time New York is getting dusted with anthrax...) On the other hand, it could be that the very measures taken to circumvent power failures may create unstable conditions in the power grid, but that's admittedly just a wild-ass guess. What if a bunch of utilities preemptively "island" themselves from the grid?? Wouldn't that make things a bit flickery and unstable?

-- coprolith (coprolith@rocketship.com), October 06, 1999.



For what its worth,

1.) The phrase "end to end testing" implies a two-dimensional process: start at the beginning of a line at point A and proceed to point Z. Unfortunately, most large, and not so large, software entities (like a payroll or inventory or shipping system) are better visualized as a three-dimensional, dynamic organism. The boundaries of such systems are often ill-defined. Therefore, figuring out what constitutes the beginning and end points of an "end to end" test is far from easy.

2.) The embedded components of certain real-time systems (air traffic control comes to mind) are essentially impossible to model or test for unified date function. Small errors can have a magnified effect when multiple systems are linked. The FAA "test" of a single aircraft going against a single control center is virtually meaningless.

The absence of "end to end" testing doesn't necessarily mean failure. It does mean there isn't enough data to accurately predict the behavior of complex large-scale systems. The anxiety of people like Hamasaki revolves around their experience of how fragile these systems can be when software changes are made in isolation from one another. The thought of a simultaneous change across all platforms is simply ominous in the extreme.

-- RDH (drherr@erols.com), October 06, 1999.


Well, I hate to sound like Bill Clinton, but it depends on how you define "end to end."

In the electric utility business, end-to-end testing usually means from Station A to Station B. However, you could look at it as having to test from the generating station to your house. In the real world no one does that. Even in end-to-end testing as defined above, not everything is tested. You just test the equipment that is associated with the line you are working on. You don't test all equipment on all lines for all changes.

Coprolith: I know of only one utility that has plans (and maybe they have changed their mind by now) to pre-island. But it is a special case. They have very little load, few interconnections and enough hydro to meet their (limited) needs. There are a lot of reasons not to pre-island. The two reasons that you mentioned are very valid. There are also contractual and legal reasons.

Smfdoc: Sorry, but I disagree with you. First, you would probably know your patient's background before giving the diagnosis. While you would convey the same information, I don't think you would describe the diagnosis the same way to someone whose doctorate was in Art History as you would to someone holding a doctorate in Bio-Chemistry, or a fellow MD. And you would want to know if you were dealing with a hypochondriac.

Roger: I notice you said in another thread that your Ph.D. is in metallurgy. Since you wrote to Dick, I assume your comments were about the utility business, as opposed to everything with a computer. Your fallacy is in your assumption that there are millions and millions of programs all interconnected. That is not true for getting the power to your house. Unless you are saying that everything has to work for anything to work?

-- The Engineer (The Engineer@tech.com), October 06, 1999.


* * * 19991006 Wednesday

Finally! { "2001: A Space Odessy" theme in the background... }

Ground Zero: "If it ain't tested, it don't work!"

YESSSS!! INDEED!

In my 33+ years of extensive ( i.e., mainframe, PC/Network, embedded [controllers] ) software/systems work, I've never implemented, nor have I ever seen implemented, any system/application/integrated component without (reasonably exhaustive) modular and end-to-end testing! Never, ever!

I'd have been fired for that, or fired anyone that tried to pull it off! No questions or reservations!

For the telecommunications infrastructure to collectively claim to be anywhere near "Y2K Ready" based on pseudo-testing in a simulated laboratory environment is prima facie ludicrous and criminal! There is no feasible possibility that samples of _all_ switching components in service today--incrementally installed/implemented over decades--could be replicated in a laboratory. Ergo, the alleged Y2K telecom laboratory scenario as a viable methodology is abjectly bogus.

This, my friends, is the unmitigated and unadulterated false premise (Bull capital-S with a HIT!) that illustrious leaders and wishful, ignorant thinkers in all facets of our culture/society have spoon-fed to the even more willingly ignorant and gullible sheeple/masses!

Y2K consequence severity, and the grotesque deceit/distortion surrounding it, have attained the distinguished status of the greatest hoax--in scope and magnitude--ever perpetrated upon global humanity.

Humanity, for future posterity, should never forget, nor forgive, the perpetrators!

Regards, Bob Mangus

* * *

-- Robert Mangus (rmangus1@yahoo.com), October 06, 1999.


Hey, Bob, how's Southwestern Bell??

Anyway, Bob, can you explain how any new systems are installed in the telecommunications industry, without complete failure? I mean, aren't they using the same testing procedures they use for any system implementation?

-- Hoffmeister (hoff_meister@my-deja.com), October 06, 1999.



I got a tip that a Dr. Altman was improperly posting my private emails on this forum. I'm perfectly capable of making my own public postings, and when I reply to someone privately netiquette says that it should stay private.

I try to be civil with everyone even when their mail to me is belligerent and insulting, as Dr. Altman's was. In this case, his taunting tone as well as his misrepresentation of my position got my goat and I decided to blow him off. It takes a lot of my time to compose a custom answer for an audience of one, especially when it's obvious that he hasn't read or hasn't understood what I've written before. For the record, I've always said that we should expect the power grid to fail. Go back to June 1998, "Training to Avoid Y2K Power Problems," http://www.wbn.com/y2ktimebomb/PP/RC/rc9826.htm

My point has always been that we can restart it manually and get the lights back on within 72 hours worst case. I haven't wavered on that in the past 14 months because my estimates never were based on the concept of Y2K readiness. They are based on how to keep things running despite malfunctioning equipment. If Dr. Altman believes that I think that things won't fail, he either never read much of my stuff or he chooses to misrepresent it.

In fact one of my criticisms of the utilities has been that they didn't prioritize the way I would have. If I were running the show, I would make contingency plans number one priority and all remediation and testing number two. Being Y2K ready is not the point. Keeping the lights on is. The utilities claimed that they could accomplish both in the time and money available and therefore could afford to rank both as priority number one.

If there are any power companies in the world who haven't started Y2K work yet, (I haven't heard of any), my advice would be simple. Uninstall all suspect digital devices right away and get used to doing without them. That way, they stand a better chance of avoiding blackouts during the rollover and can reinstall new or repaired devices as time permits after rollover. Again, the priority is lights on, not Y2K ready.

As for end-to-end testing, I consider it just a silly play on words. Anyone who claims to have run an end-to-end test in the past can be challenged on the basis that the end markers could have been set wider. A company that claims to have done an end-to-end test on its software could have tested it together with all its suppliers' and customers' software. The only time one's definition of end-to-end cannot be challenged as not being end-to-end enough is when it covers everything everywhere.

Ontario Hydro ran a test in which they took most of the province of Ontario, including the whole city of Toronto, and rolled all their clocks forward. I don't know how comprehensive their definition of "all" was, but that's a lot more end-to-end than most tests. No matter. The test could have been broader, so it wasn't really end-to-end.

When it comes to big things like power plants or oil refineries, there can never be such a thing as a truly comprehensive test. There never was any such test when the plant was first commissioned and everything was simultaneously brand new and unproved. But the components in those plants have now had years of daily end-to-end actual use. In commercial systems, we can test things like bank software in a test environment not using real money. In physical systems like airplanes and nuclear power plants there is no test mode. Somebody has to fly the plane or start the plant. I call that use, but you can call it testing.

-- Dick Mills (dmills@albany.net), October 06, 1999.


Mr. Mills:

I believe they call those first airplane flights "test flights". They are flown by a very special breed of pilots known, oddly enough, as "test pilots".

Test pilots, as any other pilot will tell you (at least according to my past readings; I have never known one personally. I take that back. I did meet an ex-Navy SEAL who tested parachutes by jumping out of planes into a salt lake in California. He was definitely nuts, even when I met him years later), are ALL nuts.

Test flights result in crashes, all too often. NO ONE takes paying commercial passengers on a test flight.

Power plants are brought up carefully, off line, are they not? Truly, I fail to see the validity of your arguments. Straw men, sir. Straw men.

Now, the Titanic was tested "on line". Our space shuttles, complicated as they are, are still being tested "on line". Some troubles, one catastrophe, many scares.

For its time, the Hindenburg was a complicated system. It worked great for some time. Then disaster. Under different circumstances, it might have never exploded, or it might have...

Please, let's not forget chaos. We are dealing here with *extremely* complex systems which cannot be pinned down into two dimensions. Maybe not three (add time as the fourth). We have introduced many changes in a short period of time. Should we not, as you just indicated, expect problems of some sort?

I most fervently hope that you are completely correct that the electric grid can be brought back up and any problems worked around in 72 hours or less. It would make my life much easier. But even if there are NO power failures anywhere, there are many other complex systems at risk, from the telcos to the IRS to the Post Office to Medicaid.

Not to mention GM and Ford and 3M and 497 other Fortune 500 companies, less than 20% of whom are finished at this point (although only 9% expect that to cause any problems. Sounds like my kids with their homework...)

I am not attacking. I'm a layman. I know that things foul up under the best of circumstances. I doubt very much that next year will qualify as the best of circumstances. I wish there had been time to do more integrated testing of larger collections of systems, even if the Holy Grail of "true" end to end testing remained ultimately unattainable.

-- mushroom (mushroom_bs_too_long@yahoo.com), October 06, 1999.


(1) Unlike Bob, above, in my 1/3 century in IT, I have seen a number of systems installed without full integration (end-to-end) testing. It is no surprise that these were disasters! In some cases, teams "met" their deadlines by installing systems without testing them. They would then work 'round the clock to fix on failure... but they met their installation deadline! (My personal bias is to test everything as thoroughly as is humanly possible...)

(2) Errors found in system integration (read: end-to-end) testing usually are more difficult and cost far more to fix than errors found earlier in the development and test cycle.

-- Mad Monk (madmonk@hawaiian.net), October 07, 1999.

