Grimm's Tale of an Overlooked Test Case
I'm bored, so a story...
Once upon a time, the customer needed a certain inventory report to replace the original, which had been eaten by the printer. (Weren't those the days?) I was still a newbie, so I asked the product architect, whom I shall call Grimm, how to do what was needed.
Grimm explained, "Rerunning the program is no problem; it can be run any time. But you need to make sure it doesn't do any updates, because those should only be done during the nightly run. It has two update processes in it, controlled by flags. Just set the flags to 'N' and the program will only produce the report."
Then he added, "Be sure you don't set them both to 'Y': The updates are not compatible; if you run them at the same time, the program will destroy the database."
I did exactly as I was told: I carefully ensured both control flags were set to "N" and then ran the program...and...disaster.
After the damage was corrected by restoring the database, and the user outage was relieved, Grimm began reviewing my setup, as I waited in dread for the verdict.
It turned out the cause of the failure was Grimm's "clever" code to prevent both processes from being run simultaneously:
    IF NOT RESET-INVENTORY-FLAG = 'Y'
        PERFORM PROCESS-INVENTORY-AGE
            THRU PROCESS-INVENTORY-AGE-EXIT
    END-IF
    IF NOT INVENTORY-AGE-FLAG = 'Y'
        PERFORM PROCESS-RESET-INVENTORY
            THRU PROCESS-RESET-INVENTORY-EXIT
    END-IF
I think maybe Grimm overlooked a test case...
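For anyone who doesn't read COBOL, here is a rough Python rendering of those two guards (the process names are just labels standing in for the PERFORM targets). Each guard tests the OTHER process's flag, so the logic is effectively inverted:

```python
# A Python translation (assumed faithful) of Grimm's two COBOL guards.
def nightly_guards(reset_flag, age_flag):
    """Return the update processes the guards actually allow to run."""
    ran = []
    if reset_flag != 'Y':   # IF NOT RESET-INVENTORY-FLAG = 'Y'
        ran.append('PROCESS-INVENTORY-AGE')
    if age_flag != 'Y':     # IF NOT INVENTORY-AGE-FLAG = 'Y'
        ran.append('PROCESS-RESET-INVENTORY')
    return ran

print(nightly_guards('Y', 'Y'))  # [] -- the "forbidden" setting runs nothing
print(nightly_guards('N', 'N'))  # both processes run -- the "safe" setting
```

The guards prevent the Y/Y case just fine; it's the N/N case, the one I was told to use, that runs both updates at once.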
Was old-timey code in all caps because it used fewer bits than all-lowercase?
For this company, it was standards; and those standards were due to tradition.
It's largely been forgotten these days, but lower case was rarely used on computers prior (in my experience) to 1978. Card punches were upper case. Unit record equipment was upper case. Printer chains were upper case. For that matter, a lot of video terminals could not do lower case.
I even used a computer (a CDC 6400) that had a 6-bit character set that originally included no lowercase characters (no room in the code set, which of course had only 64 code points). By the time I worked on it, they had enhanced the character set to use octal 76 as a shift marking a 12-bit character for lowercase. Weirdest arrangement ever: the basic characters were six bits because words were 60 bits ... 10 characters to the word, null terminated with a kicker. So while all-caps "COYNETHEDUP" fit in two words, in octal "CoyneTheDup" would be:
03761776277616760524 76107605047625762000 00000000000000000000
Why the extra word? Because the rules said there must be 12 zero bits in a row (octal 0000) to terminate a string, and the second word has only one spare character position: six zero bits, not twelve. So in this example, an extra all-zero word is needed. With all that complexity and extra memory requirement, oddly enough they still used upper case a lot.
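The arithmetic behind that extra word can be sketched in a few lines (a toy Python word-counter assuming only the rules described here: ten 6-bit codes per 60-bit word, lowercase letters costing two codes via the octal-76 shift, and 12 zero bits in a row required as a terminator; `words_needed` is my own name, and real CDC packing surely had more wrinkles):

```python
def words_needed(text):
    """Count the 60-bit CDC words a string occupies under the assumed rules:
    ten 6-bit codes per word, two codes per lowercase letter (shift + letter),
    and at least 12 zero bits in a row (two all-zero codes) to terminate."""
    codes = sum(2 if c.islower() else 1 for c in text)
    words = -(-codes // 10)          # ceil(codes / 10)
    spare = words * 10 - codes       # unused code slots in the last word
    if spare < 2:                    # fewer than 12 trailing zero bits
        words += 1                   # so a whole extra zero word is needed
    return words

print(words_needed("COYNETHEDUP"))   # 11 codes -> 2 words
print(words_needed("CoyneTheDup"))   # 19 codes -> 3 words
```

The lowercase version needs 19 codes, which leaves only one spare slot (six zero bits) in the second word, forcing the third all-zero word.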
Imagine if they had changed this thing to do Unicode...