The chief investigator for the presidential panel examining the Gulf of Mexico oil spill announced that the panel has found no proof that anyone at BP sacrificed safety to save money. “To date we have not seen a single instance where a human being made a conscious decision to favor dollars over safety,” said Fred Bartlit Jr. Rather, a series of mechanical and human failures created a chain of errors that ultimately led to the disaster, Bartlit summarized during a presentation before the panel.

This might come as a surprise after months of unremittingly harsh media coverage that has vilified BP and its executives (I think I read somewhere they drowned puppies during board meetings, just for kicks). You’d have to assume after the onslaught that someone, at some level in the company, completely disregarded human life and ecological threats to save a couple bucks.

But here’s what I find truly interesting about the Gulf spill fallout … absent any smoking gun, the presidential panel, along with an army of civil and criminal litigators, is homing in on the corporate culture at BP and asking what role it played in the disaster. How did the culture influence the chain of decisions (and mistakes) that led to the rig explosion and subsequent oil spill? How did that culture drive the thinking of individuals faced with various risks and rewards? The strengths or deficiencies of the corporate culture have become multibillion-dollar questions.

Culture comes up a lot during conversations with health care executives—and I have to admit, I’ve found myself more than once stifling a yawn while a CEO or CIO droned on about how culture change saved the day. “Culture” and “mission” can seem so nebulous and imprecise, especially during conversations about technology, that it’s hard to link those words to actions, absent any concrete examples. But I’ve also found culture to be fascinating when it’s taken seriously: it can be an indomitable force that shapes every decision an organization makes and its response to every risk it encounters.

Which leads to an observation about the BP coverage: you could easily interchange “faulty cement job,” “failed blowout preventer” and “incorrectly read pressure test” with any number of phrases that crop up in medical error reports: “wrong dosage,” “misdiagnosis” and “iatrogenic infections,” to name but a few.

If it’s a question of culture at BP, it’s a question of culture in hospitals. A “culture of accountability” or a “culture of safety” doesn’t mean much without evidence that decisions are guided by that professed culture. And since I.T. is now such a large presence in clinical care, that culture should encompass not only the practice of medicine but the “practice” of technology.

But does it? That’s the question I put to you. There are CPOE implementations where the primary objective has been to avoid annoying physicians with too many alerts. There are electronic health records that, to boost adoption, were installed with numerous workarounds and overrides that let clinicians sidestep processes they feel slow them down without being medically necessary. There are data security policies, procedures and software implementations that pay lip service to privacy and security regulations but in reality wouldn’t stop a janitor from perusing a confidential medical file.

Is this in line with the organization’s culture, or are these aberrations? In the case of CPOE, you could argue that the software is part of a culture of safety, because it sacrifices convenience for safety—I’m guessing the vast majority of physicians would prefer to write orders on a slip of paper and hand it to someone rather than take the extra time to enter them into a computer. But CPOE has been around for a long time, and few providers tackled it in earnest until the meaningful use incentive program came along. I think that speaks volumes about how culture is perceived in some corners.

A culture taken seriously, be it a culture of excellence, of safety, or of service, becomes an identity, and that identity becomes lodged in the professional DNA of those who share it.

As an example, I’ll point to the University of Michigan Health System, which for nearly a decade has disclosed medical errors to patients and family members, and has offered compensation when its employees are found at fault. Instead of building a wall of silence around its mistakes, a common strategy that ensures most of them will never be addressed clinically or through information technology, the health system decided to be honest with itself and its patients.

Now, what does that tell you about its culture? What does that say to patients, and to employees? The connections between culture and action seem pretty solid. I’m hoping that today, in your office, in a meeting or exam room, you’re making them yourself.

Greg Gillespie is the Editor-in-Chief at Health Data Management. He can be reached at greg.gillespie@sourcemedia.com.
