Is Intelligent Design Really So Far-Fetched?

You’re one hour into the redeye from San Diego to Chicago. It’s a cramped, noisy, and bumpy flight. You’re stuck in 27B, center seat. The person on your left, in the window seat, has spent most of the evening in a bar or the smoking lounge from the smell of it. The person on your right, obviously a non-frequent flier, has stuffed a huge duffle bag partially under her seat -- it has bulged out to take over most of your leg room -- and now she’s spread open the New York Times and has violated your private space by at least eight inches.

Never mind. It was a long day, and the airport hot dog dinner is starting to settle. You make yourself as small as possible, stretch out as much as the space allows, close your eyes, and hope that you can not only catch a few winks but also wake up without either of your new companions’ heads on your shoulder.

Then suddenly the cabin lights come on. From the cockpit: “Would any passenger on this plane happen to have a spare keyboard and mouse?  Mine quit working and until I get a new one, we’re stuck on this course and altitude.  USB or Bluetooth is fine.”

Absurd. Not the first part, but the second. Of course, nobody in their right mind would design a commercial airliner where the primary, or even secondary, flight controls are a keyboard and mouse, and we doubt the pilot/plane user interface is built on Microsoft .NET GUI standards.

But we do build our EHR systems on .NET and expect doctors to add keyboards and mice to their black bags -- although nowhere in the Hippocratic Oath do I see “right-click.”

I’m not arguing against automating the practice of medicine, and I do not think for a second that electronic capture and retrieval of information is a step back from paper.  But is there an alternative to keyboard, mouse, drop-down menus, dialog boxes and the whole .NET programming paradigm? Of course, there is.

When automating any human process there are three fundamental design choices: 1) mimic the manual human process as closely as possible, even if this means the automated system is nothing like anything you’ve developed before; 2) dive deep into the human process and introduce new technology -- even if you have to invent it from scratch -- if it improves the human process by increasing productivity and reducing error; or 3) impose new, foreign, and inappropriate technology on the human process because, as a programmer, it’s the only thing you know -- and it’s cheap.

When it comes to EHRs, unfortunately door No. 3 has been the clear winner!

It would be OK, I suppose, if a doctor using an EHR did not have to change anything in their daily routine.  The application, probably using handwriting recognition and dictation-to-text, would mimic paper record keeping. 

But why not go further and try to create efficiencies? For example, a usability-enhanced EHR could detect changes in diagnostic levels in blood tests and make recommendations that are approved with a tap or dismissed with a swipe. If a patient’s Rx was running low -- something a computer can calculate if it knows the parameters of the Rx (the total number of pills prescribed divided by the number of pills taken per day) and the elapsed time since the last fill -- a usability-enhanced EHR should be able to pre-fill the refill for one-tap approval and send it to the pharmacy.
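
To make that refill arithmetic concrete, here is a minimal sketch in Python. The function name, parameters, and example numbers are illustrative assumptions, not taken from any particular EHR product.

```python
from datetime import date

def days_of_supply_remaining(pills_prescribed, pills_per_day, date_filled, today):
    """Estimate remaining days of medication from the Rx parameters
    and the elapsed time since the last fill (illustrative sketch)."""
    total_days = pills_prescribed / pills_per_day      # how long the fill should last
    elapsed_days = (today - date_filled).days          # calendar days since the fill
    return total_days - elapsed_days

# Example: 90 pills at 2 per day, filled 40 days ago -> 5 days of supply left,
# so the EHR could queue a refill for the doctor's one-tap approval.
remaining = days_of_supply_remaining(90, 2, date(2011, 1, 1), date(2011, 2, 10))
print(remaining)  # 5.0
```

A real system would also need the pharmacy’s fill date and whatever refill rules the practice follows, but the core calculation is no more complicated than this.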

During a recent HIMSS-sponsored seminar on EHR usability, a question was asked about better designs that ditch the keyboard and mouse. The moderator and presenter had quite a giggle about how one doctor wanted the EHR to “read their minds.” Well, maybe the concept of reading a doctor’s mind is funny to some, but the usability concept of anticipating the next step based on a previous pattern of behavior is not far-fetched at all; it should be a fundamental underlying principle of good design. Why should a doctor have to grab a mouse or keyboard and navigate drop-down menus and dialog boxes if the next step is obvious?
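
Anticipating the next step does not require mind reading; even a crude frequency count of past transitions yields a sensible default suggestion. The sketch below is a toy illustration with hypothetical action names, not something drawn from any shipping EHR.

```python
from collections import Counter, defaultdict

class NextStepSuggester:
    """Toy model: suggest the action that has most often
    followed the action just completed."""
    def __init__(self):
        self.transitions = defaultdict(Counter)

    def record(self, previous_action, next_action):
        self.transitions[previous_action][next_action] += 1

    def suggest(self, previous_action):
        counts = self.transitions.get(previous_action)
        return counts.most_common(1)[0][0] if counts else None

# Hypothetical history: after reviewing a blood panel, this doctor
# usually adjusts the medication, occasionally orders a follow-up.
s = NextStepSuggester()
s.record("review_blood_panel", "adjust_medication")
s.record("review_blood_panel", "adjust_medication")
s.record("review_blood_panel", "order_follow_up")
print(s.suggest("review_blood_panel"))  # adjust_medication
```

Present that suggestion as a one-tap default instead of a blank dialog box, and the doctor only has to intervene when the obvious next step is wrong.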

It’s not like doing good design is rare or impossible.  There are many resources to draw on.  The University of Central Florida has experts on advanced simulation.  Game designers are darn good at making computers do all sorts of things without keyboards and mice.  UPS automated its entire package delivery process without the drivers dragging around a PC.

The same type of technology, and more important the same design philosophy, will come to EHRs eventually. And we do see some glimmers of hope among the smaller vendors. But right now the old saying that “if all you have is a hammer, everything looks like a nail” seems to sum up Microsoft Certified EHRs.

Pilot to co-pilot: “Right-click on the landing gear icon. Enter your user name and password in the dialog box. Scroll to the drop-down menu to lower the gear.”

Rob Tholemeier is a research analyst for Crosstree Capital Management in Tampa, Fla., covering the health I.T. industry. He has more than 25 years of experience as an information technology investor, research analyst, investment banker and consultant, after beginning his career as a hardware engineer and designer.