
Hello and welcome back.

I recently posted some of the reasons I prefer Macs over other computers and operating systems. I’m not totally biased, though; I intended to post an article on Windows. However, in discussion with my friend Peter Saint-Clair, we decided he might be better able to write that article.

And this is the article, right?

No, I’m afraid not. Peter had a problem. His hard drive crashed.

For real? Crashed, as in the hard drive was corrupted?

I don’t know the details, but yes, it wasn’t a simple matter of rebooting.

Ain’t that ironic. He’s writing an article to glorify the computer, and it trashes his hard disk.

It is ironic, but it was inevitable.

Nigel, that’s being a bit unfair. That sort of thing can happen to anything and anybody.

Yep, but not for the reasons you might be thinking.

It’s time I let you in on the secret of the Criticality Detector.

The what?

It’s one of engineering’s most innovative and secret achievements. Criticality detection is a secret branch of engineering that started back in the Stone Age, when the wooden handle on an axe would break at the most inopportune moment.

Like when a dinosaur attacked?

Not quite. The dinos were 65 million years earlier. But you get the idea. Just when it was desperately important that the axe worked, it didn’t.

Unwittingly, Stone Age man had uncovered the first criticality detector. The principal mechanism of critical failure at that time was simple wear and tear, but it wasn’t long before engineers were hard at work refining the process.

Why?

Legend has it that the first Stone Age engineers built a year’s supply of axes in a week, leaving them out of work for the next 51 weeks. But, by making the axes fail regularly, they could eke out a decent income in repairs and upgrades.

Surely the people cottoned on to this scheme?

They did, and it wasn’t a good time for engineers. The common man revolted. There were riots in the hills, cats and dogs were living together, and engineers were hacked down by angry mobs.

These dark days left such an impression on the human psyche that, to this day, engineers are never laid off; they get axed.

Ouch.

Yeah, dark days indeed. But engineers are a resourceful bunch. They retreated and reorganized (at least the ones that didn’t get axed did). Next winter, when all the old axes had worn out, they were back with a new and improved model. They gave warranties, carefully worded, of course, but the advertising pitch was good. Soon engineers were accepted back into society.

Accepted into society?

Well, the fringes of society, but that was close enough for the engineers.

Engineers thrived. Everything they sold wore out eventually. Their business was safe. Millennia passed without a problem. Right up until Edison invented the lightbulb.

Yeah, they’re always wearing out.

Exactly. It’s even reputed that Edison perfected the light bulb on his first attempt, and that the other 999 tries were simply to optimize the wear process to limit the bulb’s lifetime to meet his factory’s schedule.

And did people start to notice this?

You bet. There was a collective realization that the whole electric lighting thing was rigged. Historians now refer to that point in time as the “lightbulb moment.” The word “axed” got thrown around as well.

The trouble was, everything engineers did to try and make things better made things worse. It seemed like fate was playing a hand.

Engineers were staring down a double-barreled crisis. Angry mobs were after them again, and none of their inventions worked.

It was time for something drastic.

Fortunately for engineers, a little-known mathematician from Little Wallop in the north of England was working on something called crisis theory.

Crises have a theory?

Crisis theory grew out of the idea that crises themselves generate a field. The bigger the crisis, the stronger the field. The stronger the field, the more reality is distorted and the worse the crisis gets.

Once reality is sufficiently distorted, normal physical laws no longer apply. Apples can fall upwards. Trees can fall in the forest without making a noise. All the blacks in your closet suddenly match. Bank accounts fill up the more money you take out of them.

Whoa! Wait a minute. Bank accounts fill up?

Nah, I just threw that in to see if you were paying attention, but the other stuff actually happened.

Engineers seized upon crisis theory. In a matter of months it had replaced “wear and tear” as the preferred method for ensuring continued employment.

The trouble was that theory was one thing, reality was another. Careers were made and broken as engineers tried to make products using this new idea. Underground job boards were filled with postings offering exorbitant salaries for anyone with the necessary skills to implement crisis theory in products.

And just how did they implement crisis theory?

Ah, well. That’s a story in itself. The main problem with crisis theory was its complexity. People were known to have nervous breakdowns just trying to read the title page. There was a whole wing of a private “engineers only” hospital set aside for page 13 readers. It wasn’t a pretty sight.

So did they give up?

We’re talking about engineers here! They were spending other people’s money! What makes you think they would give up?

No, they didn’t give up, but what they did realize was that crisis theory was so complex it generated crisis fields intense enough to stop normal laws applying. This prevented them from implementing it.

Their saving grace came in the form of quantum theory.

Quantum what?

Quantum theory is based on the idea that things only exist in discrete states. It was being developed for subatomic particles, but engineers realized its applicability to crises.

Are you saying subatomic particles have crises?

I don’t know, but what the engineers realized was that crises exist in only two states: mindless-panic or ignorance-is-bliss. What’s more, crises exist in both states until a measurement is made. At that point the crisis will decide which state it should be in.

Wait a minute! The crisis is in both mindless-panic and ignorance-is-bliss at the same time? That’s a tiny bit mind-bending.

Yeah, Schrödinger had the same problem with a cat, but that’s for another discussion.

However, this helped engineers discover the trick to implementing crisis theory into products. All they had to do was imagine the situation was not in crisis during the design, development and testing phases. Ignorance was bliss. They ignored cost and schedule. Overruns were laughed at and common sense was banished.

Only when the product was ready for release did they let the crisis field take over. As the product launch date approached, they entered mindless-panic mode. This last phase was originally known as “commiserating” but in these politically correct days it is referred to as “commissioning”.

But wouldn’t the product fail as soon as it was commissioned?

Very true. In some of the early attempts that was exactly what happened. Clearly, something else was required to confine the outbreak of the crisis field to the times of greatest intensity.

Around 1930, engineers realized they didn’t have to create the crisis themselves; they could use the fields generated by the user. All they had to do was measure the criticality of whatever the user was doing and, when it reached a peak level, trigger the pent-up crisis energy.

This gave birth to what is now known as the Criticality Detector.
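If it helps to see the idea written down, here is a rough, entirely hypothetical sketch of that trigger loop in Python. Every name, function and threshold below is invented purely for illustration; no real device (as far as anyone will admit) ships with this code inside.

```python
import random
import time

# A purely hypothetical sketch of a Criticality Detector's trigger loop.
# All names and thresholds are invented for illustration only.

PANIC_THRESHOLD = 0.9  # how critical the user's task must be before the device strikes


def measure_user_criticality() -> float:
    """Estimate how badly the user needs everything to work right now.

    This sketch just fakes a reading between 0 (ignorance-is-bliss)
    and 1 (mindless-panic); a 'real' detector would presumably weigh
    deadlines, audience size and the presence of investors.
    """
    return random.random()


def release_pent_up_crisis() -> None:
    """Collapse the crisis field into its mindless-panic state."""
    raise RuntimeError("It's never done that before. Really.")


def criticality_detector() -> None:
    """Wait quietly in ignorance-is-bliss mode until the user's criticality peaks."""
    while True:
        if measure_user_criticality() >= PANIC_THRESHOLD:
            release_pent_up_crisis()
        time.sleep(1)  # all quiet; check again in a moment
```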

So what do these detector things look like?

Ah well, early models were large. The example, below, is a typical 1930s scene showing a lab demonstration of a criticality detector to a group of investors.

b4.gif

Needless to say, the demonstration was a failure. Reality was distorted to such an extent that the investors forgot why they were there, ordered two cups of tea and a plate of biscuits, and left. There are even hazy reports that one of the investors married a bus conductor on the journey home.

Not a resounding success then?

No. Clearly, if the engineers were to keep themselves employed, a different approach would be required. The answer came in the form of the TN432X-B (below). This remarkable machine could direct crisis fields far enough away that the investors didn’t order tea or marry strangers.

mt1.jpg

This was the machine that helped engineers put criticality detection into the (secret) mainstream.

So, how did this thing know a crisis was occurring?

Deciding when the crisis had reached sufficient proportions presented significant challenges. If the device failed too early, it would be repaired and work through the main crisis. Too late, and people would regard the device as reliable enough not to need replacement. Even in the TN432X-B the device was unwieldy (see the shot, below, which I smuggled out of the Smithsonian Secret Archives).

sidemanrotor.JPG

Further development yielded the much more compact MRIG (Miniature Relativity Insignificance Generator). Obviously, the output of this device needed inverting before being used to trigger the critical event.

sidemanswitch.JPG

The first commercially introduced Criticality Detector was used in a Snoring and Bore gramophone player.

People have long wondered why the gramophone was the first to see the use of Criticality Detectors. The answer is obvious to anyone who’s ever considered the combination of soft music, a bottle of Bordeaux, a member of the opposite sex … and the social skills of an engineer.

It’s not surprising that the failure mode for these gramophones was spontaneous combustion. In fact, it’s where the phrase “crash and burn” started.

I’m reading you on FM with that one.

Sorry, I don’t give romantic advice, so I’m going to skip that comment.

The picture below represents the only surviving example of the main components of the Snoring and Bore Criticality Detector. On the left are the original components, on the right are the more modern equivalents.

From the collection of the Very Reverend RH Simtherington-Mallard

Why are they in a cheapo plastic box? Aren’t they valuable? Aren’t they treasured artifacts?

Ah ha. That may look like a cheapo plastic box, but it’s no ordinary cheapo plastic box. That plastic box used to hold a matched pair of pink teddy bears, objects so far from a crisis that the Criticality Detector inside is fooled into its safe, ignorance-is-bliss state.

Cunning. So, now I know about these things, can I take them out of everything I own?

No.

Ohhhhhhhh, I get it. You’re saying “no” because you’re an engineer. You want things to keep failing so we’ll buy the “new and improved” versions.

No, again. The problem is that the secret started to get out. Friendly engineers would help people out by removing these things. These well-meaning engineers threatened the whole engineering profession. A different approach was required.

First of all, they tried miniaturization. The results were incredibly successful. The picture below shows a cluster of 9000 devices at 1000X magnification.

Voila_Capture15.png

Wait a minute! Are you sure they’re there?

Yep, all 9000 of them, but even these small devices could be removed by a patient user.

Something else was needed. Engineers found it in a landmark 1950s paper by Alan Turing: Computing Machinery and Crises. In this long-forgotten paper, Turing proposed that humans would be unable to differentiate between human crises and those created by sufficiently advanced software.

In the 1950s, no-one took the idea seriously, since software was so crude. But on August 12, 1981, all that changed. The introduction of the IBM PC (with Intel inside) heralded the Golden Age of Criticality Detectors.

Freed from the restraints of mere hardware, engineers were able to advance the science and art of criticality detection and crises in general. The seemingly mundane could be elevated to blood-vessel-bursting proportions by the simplest misplaced comma, register or bit.

Within months, hardware implementations of criticality detectors had all but vanished. Software was all the rage. In leaps and bounds software spread to other devices. Printers, keyboards, radios, televisions, the list was endless.

Who hasn’t had a printer refuse to print as their end of term paper is due?

Or a scanner that will only scan in sepia?

Or a website that works perfectly for everyone else?

I had a friend swear that the paper on their greatest discovery had disappeared, only to find it the day after some guy from Poland published the equivalent work.

Entirely believable. The internet has allowed Criticality Detection to be spread worldwide.

And my last girlfriend dumped me because she didn’t get my text telling her I’d be late.

Yeah, phones are not immune, either.

Criticality Detection became so powerful that governments the world over started to investigate.

That doesn’t sound good.

It wasn’t. Engineers narrowly averted having their greatest creations banned by introducing two simple ideas:

  1. Drivers. Users have come to accept that everything associated with a computer needs a driver. This incredible piece of marketing managed to hide the fact that the main purpose of this software was to drive the user’s blood pressure.
  2. Updates. Updates are the backbone of modern criticality detection. As soon as users develop “workarounds,” companies can automatically update the “drivers” to bring back the original problems or introduce new ones (there’s a sketch of this after the list).
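For the curious, here is what that “update” step might look like if anyone were reckless enough to write it down, again in made-up Python; the function and variable names are invented purely to illustrate the gag.

```python
import itertools

# Hypothetical 'driver update' logic, continuing the sketch above.
# All names here are invented for illustration only.

_problem_ids = itertools.count(1)


def new_problem() -> str:
    """Mint a brand-new problem to ship with the update."""
    return f"undocumented feature #{next(_problem_ids)}"


def apply_update(original_problems: set[str], worked_around: set[str]) -> set[str]:
    """Return the post-update problem set: old problems restored, plus one new one."""
    problems = set(original_problems) | set(worked_around)  # the workarounds no longer work
    problems.add(new_problem())                             # and here's something fresh
    return problems
```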

Fiendish.

Engineers prefer to think of this as astute business practice and the marketing people wax lyrical about empowering the dreams of the user.

Still, sounds a little mean.

I think they’re still ticked about the bottle of Bordeaux and member of the opposite sex thing.

So, they’re going to keep doing this?

Probably. I mean, they’ve got to keep themselves employed somehow.

Armed with these tools, engineers have taken “crisis management” to new highs. Printing an end of year report has been known to reduce life expectancy by years. Filing taxes electronically has reduced grown men to tears. Even defrosting fridges has become a black art in which the user must divert attention away from their real intention before diving for the defrost switch.

But computers are the biggest offenders.

Phew. I’m glad I have a Linux box.

Why, doesn’t that have problems?

Nope, not one. It works every day. Day in, day out. Every time.

Here, I’ll prove it. I just press this switch and …

Hmm, it’s never done that before.

Ah ha! I rest my case.

No, wait, wait, no, no, stop, stop. *clicks power switch and sighs*

Problems?

Er, no. It’ll just take a while. Never done that before though. Really.

Kind of brings us back to Peter’s hard drive crash, doesn’t it?

Yeah, it’s weird. Him, and now me. Do you have his number?

Sure. I hear he’s back in business.

So, his Windows article will be along soon.

As long as no one has a crisis.

But what about our readers? Have their hard drives crashed? Do their printers print Greek characters (even when they’re not Greek)?

Do strange things happen with their iPhones? Or do electrical things stop working as soon as they touch them?

Does their IT department beg them not to touch anything with a wire coming out of it?

All good questions. I don’t know, but they do. Maybe they’ll let us know 🙂

Cheers.
